Robust Process monitoring via Stable Principal Component Pursuit
Authors
Abstract
Similar resources
Efficient algorithms for robust and stable principal component pursuit problems
The problem of recovering a low-rank matrix from a set of observations corrupted with gross sparse error is known as robust principal component analysis (RPCA) and has many applications in computer vision, image processing and web data ranking. It has been shown that under certain conditions, the solution to the NP-hard RPCA problem can be obtained by solving a convex optimization...
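For reference, the convex program this line of work builds on (the principal component pursuit relaxation and its stable variant; the exact formulation used in the paper above may differ) splits an observed matrix M into a low-rank part L and a sparse part S:

\[ \min_{L,S} \ \|L\|_* + \lambda \|S\|_1 \quad \text{subject to} \quad L + S = M, \]

where \(\|L\|_*\) is the nuclear norm and \(\|S\|_1\) the entrywise l1 norm. The stable version replaces the equality by \(\|M - L - S\|_F \le \delta\) so that dense, small-magnitude noise can be tolerated; a common default weight is \(\lambda = 1/\sqrt{\max(n_1, n_2)}\) for an \(n_1 \times n_2\) matrix.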
Stable Analysis of Compressive Principal Component Pursuit
Compressive principal component pursuit (CPCP) recovers a target matrix that is a superposition of low-complexity structures from a small set of linear measurements. Previous works mainly focus on the analysis of the existence and uniqueness. In this paper, we address its stability. We prove that the solution to the related convex programming of CPCP gives an estimate that is stable to small en...
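As a point of reference (standard compressive PCP notation, not quoted from the paper), CPCP observes the superposition M = L_0 + S_0 only through its projection onto a low-dimensional measurement subspace Q and solves

\[ \min_{L,S} \ \|L\|_* + \lambda \|S\|_1 \quad \text{subject to} \quad \mathcal{P}_Q(L + S) = \mathcal{P}_Q(M), \]

where \(\mathcal{P}_Q\) denotes orthogonal projection onto Q. A stability statement of the kind described above relaxes the constraint to \(\|\mathcal{P}_Q(M - L - S)\|_F \le \delta\), so that the recovered pair degrades gracefully under small measurement noise.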
Robust Principal Component Analysis by Projection Pursuit
Different algorithms for principal component analysis (PCA) based on the idea of projection pursuit are proposed. We show how the algorithms are constructed, and compare the new algorithms with standard algorithms. With the R implementation pcaPP we demonstrate the usefulness on real data examples. Finally, it will be outlined how the algorithms can be used for robustifying other multivariate m...
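The projection pursuit idea can be made concrete with a minimal sketch: take candidate directions from the robustly centered and normalized observations, keep the direction that maximizes a robust spread measure of the projections, deflate the data, and repeat. The helper below is a hypothetical illustration of that principle in Python, not the pcaPP implementation, which provides refined and much faster variants of the same idea.

import numpy as np

def mad(x):
    # Median absolute deviation: a robust scale of a 1-D sample.
    return np.median(np.abs(x - np.median(x)))

def pp_pca(X, k):
    # Minimal projection-pursuit PCA sketch (illustrative helper, not pcaPP's algorithm):
    # candidate directions are the robustly centered, normalized observations; each
    # component maximizes the MAD of the projected data, then the data are deflated.
    X = X - np.median(X, axis=0)                   # robust centering
    components = []
    for _ in range(k):
        norms = np.linalg.norm(X, axis=1, keepdims=True)
        cand = X / np.maximum(norms, 1e-12)        # candidate directions on the unit sphere
        scores = np.array([mad(X @ a) for a in cand])
        a = cand[int(np.argmax(scores))]
        components.append(a)
        X = X - np.outer(X @ a, a)                 # deflate: project out the found direction
    return np.array(components)

# Usage: two robust directions of a contaminated Gaussian sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:10] += 10.0                                     # gross outliers in the first ten rows
print(pp_pca(X, 2).shape)                          # (2, 5)

Because each deflation step removes the component already found, the candidate directions of later iterations are automatically orthogonal to the earlier ones, mirroring the sequential construction used in classical projection pursuit PCA.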
Hyperplane Clustering via Dual Principal Component Pursuit
State-of-the-art methods for clustering data drawn from a union of subspaces are based on sparse and low-rank representation theory. Existing results guaranteeing the correctness of such methods require the dimension of the subspaces to be small relative to the dimension of the ambient space. When this assumption is violated, as is, for example, in the case of hyperplanes, existing methods are ...
Dual Principal Component Pursuit
We consider the problem of outlier rejection in single subspace learning. Classical approaches work with a direct representation of the subspace, and are thus efficient when the subspace dimension is small. Our approach works with a dual representation of the subspace and hence aims to find its orthogonal complement; as such it is particularly suitable for high-dimensional subspaces. We pose th...
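In the dual representation referred to here, the inlier subspace is described by vectors orthogonal to it: collecting the (outlier-contaminated, normalized) data as columns of X, dual principal component pursuit searches the unit sphere for a normal vector whose inner products with the data are as sparse as possible,

\[ \min_{b} \ \|X^\top b\|_1 \quad \text{subject to} \quad \|b\|_2 = 1, \]

so inliers lying in the subspace contribute (near-)zero terms and the recovered b lies in the subspace's orthogonal complement; further dual directions are obtained by re-solving the problem with orthogonality constraints to the vectors already found.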
Journal
Journal title: IFAC-PapersOnLine
Year: 2015
ISSN: 2405-8963
DOI: 10.1016/j.ifacol.2015.09.036